Characterization, through re-sequencing, of genetic variants associated with high altitude adaptation in North Caucasian ethnic groups.
This Master's thesis presents the results of research carried out at the Wellcome Trust Sanger Institute, Hinxton, UK, under the direct supervision of Dr. Chris Tyler-Smith, Head of the Human Evolution Team.
The aim of the study is to detect signals of positive selection in human populations living at high altitude using targeted next-generation DNA re-sequencing of candidate genes and control regions in 96 individuals. For this purpose, 55 Daghestani individuals from three ethnic groups were selected as case populations living at high altitude (>2,000 m a.s.l.). Adygei (n = 20), CEU (n = 20) and one chimpanzee sample were used as controls.
The Daghestani populations have a long history of permanence (>10,000 years) at high altitude, and their peculiar demographic history makes them suitable for hypoxia adaptation studies. In order to disentangle selective and demographic effects on the genome, fifteen candidate genes involved in oxygen metabolism (HIF1α, PHD1, PHD2, PHD3, VHL, EPO, EPOr, VEGF, EDN1, NOS3, ACE, and the α-, β-, γ- and δ-globin genes) were sequenced together with twenty-seven "control regions" selected among those used in the Hominid Project (Wall, Cox et al. 2008) and the ENCODE Project (http://www.genome.gov/10005107).
DNA samples obtained from whole blood or cell cultures were used to amplify the targeted genomic regions by long-template PCR. The amplimers were visualized by gel electrophoresis, and those belonging to the same individual were pooled together and subsequently purified using the QIAquick PCR Purification Kit. Each pooled sample was indexed by adding an eight-nucleotide tag, and 8 samples were combined and sequenced on each lane of an Illumina flow cell. The sequenced reads were sorted and aligned to the reference sequence using the MAQ algorithm (Li, Ruan et al. 2008). Parameters checking the quality of the sequencing output, as well as summary statistics and neutrality tests (Tajima's D, Fu and Li's D and F, Fay and Wu's H), were calculated on phased haplotype data. Candidate targets of positive selection (the re-sequenced genes) and neutral regions were analyzed for putative signals of adaptation to hypoxia, and several novel SNPs were characterized. In particular, four non-synonymous mutations were found to alter the functionality of HIF1α, ACE, EPOr and NOS3, and could be considered putative candidates for hypoxia adaptation.
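The neutrality tests mentioned above can be illustrated with a minimal sketch. The following computes Tajima's D from a small set of phased haplotypes (toy data, not the Daghestani set; the variance constants follow Tajima's 1989 derivation):

```python
import itertools
import math

def tajimas_d(haplotypes):
    """Tajima's D from a list of equal-length haplotype strings."""
    n = len(haplotypes)
    length = len(haplotypes[0])
    # Number of segregating sites S
    seg = [i for i in range(length) if len({h[i] for h in haplotypes}) > 1]
    s = len(seg)
    if s == 0:
        return 0.0, 0.0, 0
    # Mean number of pairwise differences (theta_pi)
    pairs = list(itertools.combinations(haplotypes, 2))
    pi = sum(sum(a != b for a, b in zip(h1, h2)) for h1, h2 in pairs) / len(pairs)
    # Constants of the test (Tajima 1989)
    a1 = sum(1.0 / i for i in range(1, n))
    a2 = sum(1.0 / i ** 2 for i in range(1, n))
    b1 = (n + 1) / (3.0 * (n - 1))
    b2 = 2.0 * (n ** 2 + n + 3) / (9.0 * n * (n - 1))
    c1 = b1 - 1.0 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1 ** 2
    e1 = c1 / a1
    e2 = c2 / (a1 ** 2 + a2)
    # D = (theta_pi - theta_W) / sqrt(Var), with theta_W = S / a1
    d = pi - s / a1
    var = e1 * s + e2 * s * (s - 1)
    return d / math.sqrt(var), pi, s

D, pi, S = tajimas_d(["AATG", "AATG", "ACTG", "ACTA"])
```

In practice such statistics would be computed per region on the phased re-sequencing data, with significance assessed against the control regions.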
Areal surface texture parameters on surface
Additive manufacturing (AM) techniques enable the manufacture of components with free-form geometries and complex internal and external features. X-ray computed tomography (CT) is increasingly being used to inspect internal features of AM parts. An advantage of the CT process, compared to optical and stylus instruments with limited acquisition slope angles, is the ability to reconstruct reentrant features (undercuts). Processing reentrant features provides an advantage in the computation of surface parameters: if the surface includes many reentrant features, their elimination can lead to a biased estimation of parameters related to the height or the area of the scale-limited surface. A unified framework capable of handling free-form surfaces, with generic form, reentrant features and unevenly spaced points, such as those from CT reconstruction, will be proposed. Standard software packages employed for roughness parameter evaluation require height data on a rectangular grid. This allows the computation of areal parameters based on discrete methods with good approximation, dependent upon the sample size. The reconstruction from CT volume to mesh allows an adaptive meshing to be performed, based on the maximum allowable distance between the implicit function (the implicit surface defined by a constant grey value) and the final triangular mesh [1]. With irregular meshes it is not possible to evaluate the integral with the discrete approximation, and a bias in the parameter computation can arise. In this paper an approach that approximates a generic mesh with locally refined (LR) B-splines is proposed [2]. The approach can be applied to a generic form surface because the local stretching of the surface is taken into account. Mesh parameterisation enables undercuts to be handled: each acquired point is described as a function of two abstract parameters. The proposed method will be compared with the discrete (ISO 25178-2 compliant [3]) method implemented in standard software packages [4]. Since filtering techniques based on a general mesh are not yet defined in the standard, the primary surfaces (the surface after removing the form) will be analysed. The areal parameters of a Rubert sample (casting plate 334, nominal Ra of 25 µm) measured with a focus variation (FV) instrument will be evaluated. Two form surfaces will be taken into account: plane and cylinder. The robustness of the discrete method will finally be evaluated with the mesh reconstructed from two CT measurements: the Rubert sample and an AM part.
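For grid-based height data, the discrete height parameters of ISO 25178-2 are straightforward to compute. A minimal sketch, with least-squares plane removal standing in for form removal to obtain the primary surface (the plane form case only; the cylinder case and the LR B-spline approach are beyond this sketch):

```python
import numpy as np

def areal_params(z, dx=1.0, dy=1.0):
    """Sa and Sq of a rectangular-grid height map, after least-squares
    plane (form) removal to obtain the primary surface."""
    ny, nx = z.shape
    x, y = np.meshgrid(np.arange(nx) * dx, np.arange(ny) * dy)
    # Fit z = a*x + b*y + c by least squares and subtract it
    design = np.column_stack([x.ravel(), y.ravel(), np.ones(z.size)])
    coef, *_ = np.linalg.lstsq(design, z.ravel(), rcond=None)
    residual = z.ravel() - design @ coef
    sa = np.mean(np.abs(residual))        # arithmetical mean height
    sq = np.sqrt(np.mean(residual ** 2))  # root mean square height
    return sa, sq

# Tilted plane plus a +/-0.05 checkerboard texture: Sa = Sq = 0.05 exactly
i, j = np.meshgrid(np.arange(16), np.arange(16))
z = 0.3 * i + 0.1 * j + 0.05 * (-1.0) ** (i + j)
sa, sq = areal_params(z.astype(float))
```

The bias discussed in the abstract arises precisely when such a rectangular-grid assumption is forced onto an irregular CT mesh with reentrant features.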
Curvature based sampling of curves and surfaces
Efficient sampling methods enable the reconstruction of a generic surface with a limited number of points. The reconstructed surface can therefore be used for inspection purposes. In this paper a sampling method that enables the reconstruction of a curve or surface is proposed. The input of the proposed algorithm is the number of required samples. The method takes into account two factors: the regularity of the sampling and the complexity of the object. A higher density of samples is assigned where there are significant features, as described by the curvature. The analysed curves and surfaces are described through B-spline spaces. The sampling of surfaces generated by two or more curves is also discussed.
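The idea of blending sampling regularity with curvature-driven density can be sketched in 2D: weight a uniform density against the curvature of the curve, then place samples by inverting the cumulative density. The 50/50 blend weight and the explicit-function form y = f(x) below are illustrative assumptions, not the paper's B-spline formulation:

```python
import numpy as np

def curvature_sample(f, a, b, n_samples, alpha=0.5, n_grid=2000):
    """Sample x-locations on the curve y = f(x) with density proportional
    to alpha * uniform + (1 - alpha) * normalized curvature."""
    x = np.linspace(a, b, n_grid)
    y = f(x)
    dy = np.gradient(y, x)
    d2y = np.gradient(dy, x)
    kappa = np.abs(d2y) / (1.0 + dy ** 2) ** 1.5   # curvature of y = f(x)
    w = alpha / (b - a) + (1 - alpha) * kappa / np.trapz(kappa, x)
    # Cumulative density (trapezoidal), then inverse-CDF sampling
    cdf = np.concatenate([[0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))])
    cdf /= cdf[-1]
    targets = np.linspace(0.0, 1.0, n_samples)
    return np.interp(targets, cdf, x)

xs = curvature_sample(np.sin, 0.0, 2 * np.pi, 40)
```

On y = sin(x) this places more samples near the curvature peaks (x ≈ π/2, 3π/2) than near the inflection at x ≈ π, while the uniform term keeps the low-curvature stretches from being starved entirely.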
Hierarchical metamodeling: Cross validation and predictive uncertainty
At Esaform 2013 a hierarchical metamodeling approach was presented, able to combine the results of numerical simulations and physical experiments into a unique response surface, which is a "fusion" of both data sets. The method was presented with respect to the structural optimization of a steel tube, filled with an aluminium foam, intended as an anti-intrusion bar. The prediction yielded by a conventional way of metamodeling the results of FEM simulations can be considered trustworthy only if the accuracy of the numerical models has been thoroughly tested and the simulation parameters have been sufficiently calibrated. On the contrary, the main advantage of a hierarchical metamodel is to yield a reliable prediction of the response variable to be optimized, even in the presence of not completely calibrated or accurate FEM models. In order to demonstrate these statements, in this paper the authors compare the prediction ability of a "fusion" metamodel based on under-calibrated simulations with a conventional approach based on calibrated FEM results. Both metamodels will be cross validated with a "leave-one-out" technique, i.e. by excluding one experimental observation at a time and assessing the predictive ability of the model. Furthermore, the paper will demonstrate how the hierarchical metamodel is able to provide not only an average estimated value for each excluded experimental observation, but also an estimation of the uncertainty of the prediction of the average value.
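The leave-one-out scheme itself is model-agnostic and takes only a few lines. A minimal sketch, with an ordinary least-squares quadratic surrogate standing in for the metamodel (an illustrative assumption, not the paper's fusion model):

```python
import numpy as np

def loo_errors(X, y, fit, predict):
    """Leave-one-out cross validation: refit excluding each observation
    in turn and record the prediction error at the held-out point."""
    errors = []
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        model = fit(X[mask], y[mask])
        errors.append(y[i] - predict(model, X[i:i + 1])[0])
    return np.array(errors)

# Quadratic least-squares surrogate standing in for the metamodel
def fit(X, y):
    design = np.column_stack([X[:, 0] ** 2, X[:, 0], np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return coef

def predict(coef, X):
    return coef[0] * X[:, 0] ** 2 + coef[1] * X[:, 0] + coef[2]

X = np.linspace(0, 1, 8).reshape(-1, 1)
y = 2 * X[:, 0] ** 2 - X[:, 0] + 3   # exactly quadratic "experiments"
rmse = np.sqrt(np.mean(loo_errors(X, y, fit, predict) ** 2))
```

With real experimental data the held-out errors are compared against the metamodel's own predictive uncertainty, which is what the hierarchical approach additionally provides.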
Through the layers of the Ethiopian genome: a survey of human genetic variation based on genome-wide genotyping and re-sequencing data
Understanding our evolutionary history as a species has long been one of the most attractive and controversial themes of scientific investigation. Owing to its geographical position, outstanding fossil record and richness of human diversity, the Horn of Africa and, particularly, the Ethiopian region offer an unmatched opportunity to investigate our origins from a genetic perspective. To carry out a genome-wide survey of this region, 13 of the estimated 80 extant Ethiopian populations were typed on an Illumina Omni 1M SNP array. The results showed a good concordance between genetic and linguistic stratification and, overall, a complex population structure placing the Ethiopians in between North and sub-Saharan Africans, due to recent non-African gene flow dated at around 3,000 years ago. Furthermore, the SNP array data unveiled putative traces of the out-of-Africa migrations as well as, in two of the typed populations, signatures of genetic adaptation to high altitude.
To obtain an unbiased, high-resolution representation of the Ethiopian genetic landscape, 25 individuals from each of five populations were newly collected and sequenced on an Illumina HiSeq platform. These populations were chosen, from among the ones typed on the SNP array, to represent the main components of Ethiopian genetic diversity. Of the 25 samples per population, 24 were sequenced at low depth to generate a broad list of genetic variants, while one sample from each was sequenced at high depth to provide a higher-resolution list of variants peculiar to each analysed population. The 125 Ethiopian genomes thus sequenced, while overall consistent with the genotyping results, described the Ethiopian populations in a less biased way than the SNP array data. Furthermore, estimation of past effective population size fluctuations from the individual genomes unveiled a unique pattern in the ancestry of the Ethiopian populations in the early stages of human evolution. These results provide a data resource which can be used in future analyses. This work was supported by: Domestic Research Studentship, Cambridge European Trust and Emmanuel College.
Reduction of calibration effort in FEM-based optimization via numerical and experimental data fusion
In this paper a fusion metamodeling approach is suggested as a method for reducing the experimental and computational effort generally required for calibrating the parameters of FEM simulation models. The metamodel is used inside an optimization routine for linking data coming from two different sources: simulations and experiments. The method is applied to a real problem: the optimal design of a metal-foam-filled tube to be used as an anti-intrusion bar in vehicles. The model is hierarchical, in the sense that one set of data (the experiments) is considered to be more reliable and is labeled as "high-fidelity", while the other set (the simulations) is labeled as "low-fidelity". In the proposed approach, Gaussian models are used to describe the results of computer experiments because they are flexible and can easily interpolate data coming from deterministic simulations. Since the results of experiments are fully accurate but aleatory, a second-stage ("linkage") model is used, which adjusts the prediction provided by the first model to more accurately represent the real experimental data. In the paper, the modeling and prediction ability of the method is first demonstrated and explained by means of artificially generated data and then applied to the optimization of foam-filled tubular structures. The fusion metamodel yields comparable predictions (and optimal solutions) whether built over calibrated or non-calibrated FEM simulations.
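The two-stage structure can be sketched in a few lines: a low-fidelity model is fitted to the dense, cheap simulation data; a linkage model is fitted to the residuals of the sparse experiments; and the fused prediction is their sum. The polynomial forms and the bias term below are illustrative assumptions (the paper uses Gaussian models), not the paper's actual functions:

```python
import numpy as np

# Ground truth measured by "experiments", and a systematically biased
# low-fidelity "simulation" of it (hypothetical functions for illustration)
def f_true(x):
    return x ** 2

def f_sim(x):
    return x ** 2 + 0.5 * x   # un-calibrated FEM surrogate with a bias term

# Stage 1: metamodel of the dense simulation data
x_sim = np.linspace(0, 2, 21)
lo = np.polynomial.Polynomial.fit(x_sim, f_sim(x_sim), deg=2)

# Stage 2: linkage model fitted to the discrepancy at the few experiments
x_exp = np.array([0.2, 0.9, 1.7])
delta = np.polynomial.Polynomial.fit(x_exp, f_true(x_exp) - lo(x_exp), deg=1)

def fused(x):
    """Hierarchical prediction: low-fidelity model plus linkage correction."""
    return lo(x) + delta(x)
```

Even though the low-fidelity model is biased everywhere, the linkage stage recovers the experimental response at unobserved points, which is the behaviour the abstract claims for non-calibrated FEM models.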
A new approach to muon g-2 with space-like data: analysis and fitting procedure
In this thesis I have studied a new method to measure the Hadronic Leading-Order (HLO) contribution to the muon anomalous magnetic moment (a_mu). In the first part I describe the Standard Model approach to evaluating a_mu. From the perturbative expansion we know that there are several contributions to a_mu: QED, electroweak and hadronic. While the QED and electroweak contributions can be calculated with increasing precision by perturbative methods, the hadronic contributions at this energy scale cannot, hence experimental input is required to estimate them. It has long been known that a_mu can be calculated by means of the dispersive approach using electron-positron annihilation data. I have studied this method and discussed why it could hardly improve the theoretical uncertainty much further. In the second part of the thesis I study an innovative proposal to measure the HLO contribution to a_mu. The novel method is based on the idea of using the elastic scattering of high-energy muons on electrons at rest. The strength of the idea is to rely on t-channel (space-like) scattering data to measure the HLO contribution to a_mu. In this case the differential elastic cross-section allows the running of α(t) to be measured with very high precision, and the hadronic shift Δα_had(t) to be determined by subtracting all the contributions due to QED and electroweak effects. Through Δα_had(t) the HLO contribution to a_mu can be calculated by integrating a smooth function of the transferred momentum, exploiting just a single scattering process. I have performed a preliminary study of the fitting procedure to extract Δα_had(t). At this stage I used a sample of data obtained using the leading-order approximation of the scattering cross-section. According to present estimates this new approach, which represents an independent, complementary technique to evaluate the HLO corrections to a_mu, will reach a precision competitive with that of the present results in just two years of data taking.
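The smooth space-like integral mentioned above has, in the standard form used in this context (sketched here for orientation), the one-dimensional representation

```latex
a_\mu^{\mathrm{HLO}}
  = \frac{\alpha}{\pi} \int_0^1 dx\, (1-x)\,
    \Delta\alpha_{\mathrm{had}}\!\left[t(x)\right],
\qquad
t(x) = -\,\frac{x^2 m_\mu^2}{1-x} \; < \; 0 ,
```

so the whole hadronic contribution is obtained from Δα_had(t) at space-like (negative) t, which is exactly what the μe elastic cross-section gives access to; the integrand is smooth over the full range, which is what makes the fitting procedure studied in the thesis tractable.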